Should mentoring be routinely introduced into general dental practice to reduce the risk of occupational stress?
Introduction: Occupational stress within general dental practice can potentially have an adverse impact on a practitioner's wellbeing and the quality of healthcare provided by that individual. Mentoring has routinely been utilised in other professions for stress management; however, there is little in the dental literature discussing the benefits of mentorship in reducing occupational stress for dental practitioners.
Aim: The aim of this study was to explore the perceptions of experienced foundation dental trainers within the Health Education Kent, Surrey and Sussex postgraduate deanery as to the usefulness of routine mentoring as a tool to reduce occupational stress.
Methods: Using a qualitative approach, six individual semi-structured interviews were undertaken. Recorded interviews were transcribed and transcriptions were analysed using thematic coding to identify overarching themes.
Results: Both similarities and differences with the existing literature on routine mentoring within professional settings were identified. Foundation dental trainers were positive towards the concept of routine mentoring, although there was also a degree of scepticism regarding the potential uptake among colleagues. There was a perception that mentoring might more practically be used as a reactive tool. Multiple potential barriers to routine mentoring were identified, including funding, scheduling and a lack of training.
Conclusions: The analysis identified that currently, experienced foundation dental practitioners do not consider routine mentoring a practical option for the prevention of occupational stress. The results suggest that further education is required on the benefits of routine mentoring as a strategy for occupational stress management. However, with additional resources to free up time, a hybrid model of mentoring and coaching has significant potential in general dental practice.
N=1* in 5 dimensions: Dijkgraaf-Vafa meets Polchinski-Strassler
One of the powerful techniques to analyze the 5 dimensional Super Yang Mills
theory with a massive hypermultiplet (N=1*) is provided by the AdS/CFT
correspondence. It predicts that, for certain special values of the
hypermultiplet mass, this theory develops nonperturbative branches of the
moduli space as well as new light degrees of freedom.
We use the higher dimensional generalization of the matrix model/gauge theory
correspondence and recover all the predictions of the supergravity analysis. We
construct the map between the four dimensional holomorphic superpotential and
the five dimensional action and explicitly show that the superpotential is flat
along the nonperturbative branches. This is the first instance in which the
Dijkgraaf-Vafa method is used to analyze intrinsically higher dimensional
phenomena.

Comment: 28 pages, LaTeX
Boundary States for D-branes with Traveling Waves
We construct boundary states for D-branes which carry traveling waves in the
covariant formalism. We compute their vacuum amplitudes to investigate their
interactions. In non-compact space, the vacuum amplitudes become trivial as is
common in plane wave geometries. However, we found that if they are
compactified in the traveling direction, then the amplitudes are affected by
non-trivial time dependent effects. The interaction between D-branes with waves
traveling in the opposite directions (`pulse-antipulse scattering') are also
computed. Furthermore, we apply these ideas to open string tachyon condensation
with traveling waves.

Comment: 30 pages, 1 figure, LaTeX, minor corrections, references added
Lamotrigine versus inert placebo in the treatment of borderline personality disorder: study protocol for a randomized controlled trial and economic evaluation
Background: People with borderline personality disorder (BPD) experience rapid and distressing changes in mood, poor social functioning and have high rates of suicidal behaviour. Several small-scale studies suggest that mood stabilizers may produce short-term reductions in symptoms of BPD, but have not been large enough to fully examine clinical and cost-effectiveness.
Methods/Design: A two parallel-arm, placebo-controlled randomized trial of usual care plus either lamotrigine or an inert placebo for people aged over 18 who are using mental health services and meet diagnostic criteria for BPD. We will exclude people with comorbid bipolar affective disorder or psychosis, those already taking a mood stabilizer, those who speak insufficient English to complete the baseline assessment, and women who are pregnant or contemplating becoming pregnant. Those who meet the inclusion criteria and provide written informed consent will be randomized to up to 200 mg of lamotrigine per day or an inert placebo (up to 400 mg if taking combined oral contraceptives). Participants will be randomized via a remote web-based system using permuted stacked blocks stratified by study centre, severity of personality disorder, and level of bipolarity. Follow-up assessments will be conducted by masked researchers 12, 24 and 52 weeks after randomization. The primary outcome is the Zanarini Rating Scale for Borderline Personality Disorder (ZAN-BPD). The secondary outcomes are depressive symptoms, deliberate self-harm, social functioning, health-related quality of life, resource use and costs, side effects of treatment, adverse events and withdrawal of trial medication due to adverse effects. The main analyses will use intention to treat without imputation of missing data. The economic evaluation will take an NHS/Personal Social Services perspective. A cost-utility analysis will compare differences in total costs and differences in quality of life using QALYs derived from the EQ-5D.
Discussion: The evidence base for the use of pharmacological treatments for people with borderline personality disorder is poor. In this trial we will examine the clinical and cost-effectiveness of lamotrigine to assess what impact, if any, offering this treatment has on people's mental health, social functioning, and use of other medication and other resources.
Trial registration: Current Controlled Trials ISRCTN90916365 (registered 01/08/2012).
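The allocation scheme described in the protocol above — permuted blocks within strata — can be sketched in a few lines. This is a toy illustration, not the trial's remote web-based system; the function name, block size and seed are assumptions, and in practice one such sequence would be generated per stratum (centre × severity × bipolarity level) with allocation concealed from researchers.

```python
import random

def permuted_block_randomization(n, block_size=4,
                                 arms=("lamotrigine", "placebo"), seed=1):
    """Allocate n participants within one stratum using permuted blocks.

    Each block contains an equal number of each arm in a random order,
    so the allocation stays close to balanced throughout recruitment.
    """
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)          # permute treatment order within the block
        allocation.extend(block)
    return allocation[:n]

alloc = permuted_block_randomization(20)
```

Blocking guarantees near-balance at any interim point, which matters for a trial recruiting slowly across several centres.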
Event-by-Event Fluctuations of Particle Ratios in Central Pb+Pb Collisions at 20 to 158 AGeV
In the vicinity of the QCD phase transition, critical fluctuations have been
predicted to lead to non-statistical fluctuations of particle ratios, depending
on the nature of the phase transition. Recent results of the NA49 energy scan
program show a sharp maximum of the ratio of K+ to Pi+ yields in central Pb+Pb
collisions at beam energies of 20-30 AGeV. This observation has been
interpreted as an indication of a phase transition at low SPS energies. We
present first results on event-by-event fluctuations of the kaon to pion and
proton to pion ratios at beam energies close to this maximum.

Comment: 4 pages, 4 figures, Quark Matter 2004 proceedings
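A standard way to quantify such event-by-event fluctuations is to compare the width of the event-wise K/pi ratio distribution with that of mixed events, which retain only statistical fluctuations; the difference is often quoted as sigma_dyn. A minimal sketch with toy Poisson multiplicities (all numbers are assumptions for illustration, not NA49 data):

```python
import numpy as np

rng = np.random.default_rng(0)
n_events = 20_000
kaons = rng.poisson(10, n_events)      # toy per-event kaon multiplicity
pions = rng.poisson(100, n_events)     # toy per-event pion multiplicity
mask = pions > 0
ratio = kaons[mask] / pions[mask]
sigma_data = ratio.std() / ratio.mean()          # relative width, "real" events

# mixed events: pair kaon counts with pion counts from other events,
# destroying any within-event correlation
ratio_mix = rng.permutation(kaons[mask]) / pions[mask]
sigma_mixed = ratio_mix.std() / ratio_mix.mean()

diff = sigma_data**2 - sigma_mixed**2
sigma_dyn = np.sign(diff) * np.sqrt(abs(diff))
```

For independent Poisson multiplicities, as here, sigma_dyn is consistent with zero; a genuine dynamical signal near the phase transition would appear as a significant nonzero value.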
Quotients of AdS_{p+1} x S^q: causally well-behaved spaces and black holes
Starting from the recent classification of quotients of Freund--Rubin
backgrounds in string theory of the type AdS_{p+1} x S^q by one-parameter
subgroups of isometries, we investigate the physical interpretation of the
associated quotients by discrete cyclic subgroups. We establish which quotients
have well-behaved causal structures, and of those containing closed timelike
curves, which have interpretations as black holes. We explain the relation to
previous investigations of quotients of asymptotically flat spacetimes and
plane waves, of black holes in AdS and of Gödel-type universes.

Comment: 48 pages; v2: minor typos corrected
Ultra-High Energy Neutrino Fluxes and Their Constraints
Applying our recently developed propagation code we review extragalactic
neutrino fluxes above 10^{14} eV in various scenarios and how they are
constrained by current data. We specifically identify scenarios in which the
cosmogenic neutrino flux, produced by pion production of ultra high energy
cosmic rays outside their sources, is considerably higher than the
"Waxman-Bahcall bound". This is easy to achieve for sources with hard injection
spectra and luminosities that were higher in the past. Such fluxes would
significantly increase the chances to detect ultra-high energy neutrinos with
experiments currently under construction or in the proposal stage.

Comment: 11 pages, 15 figures, version published in Phys.Rev.
Making maps from Planck LFI 30 GHz data with asymmetric beams and cooler noise
The Planck satellite will observe the full sky at nine frequencies from 30 to 857 GHz. Temperature and polarization frequency maps made from these observations are prime deliverables of the Planck mission. The goal of this paper is to examine the effects of four realistic instrument systematics in the 30 GHz frequency maps: non-axially-symmetric beams, sample integration, sorption cooler noise, and pointing errors. We simulated one-year long observations of four 30 GHz detectors. The simulated timestreams contained cosmic microwave background (CMB) signal, foreground components (both galactic and extra-galactic), instrument noise (correlated and white), and the four instrument systematic effects. We made maps from the timelines and examined the magnitudes of the systematic effects in the maps and their angular power spectra. We also compared the maps of different mapmaking codes to see how they performed. We used five mapmaking codes (two destripers and three optimal codes). None of our mapmaking codes makes any attempt to deconvolve the beam from its output map. Therefore all our maps had similar smoothing due to beams and sample integration. This is a complicated smoothing, because each map pixel has its own effective beam. Temperature-to-polarization cross-coupling due to beam mismatch causes a detectable bias in the TE spectrum of the CMB map. The effects of cooler noise and pointing errors did not appear to be major concerns for the 30 GHz channel. The only essential difference found so far between mapmaking codes that affects accuracy (in terms of residual root-mean-square) is baseline length. All optimal codes give essentially indistinguishable results. A destriper gives the same result as the optimal codes when the baseline is set short enough (Madam). For longer baselines destripers (Springtide and Madam) require less computing resources but deliver a noisier map.
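The destriping approach discussed above — modelling correlated noise as one constant baseline per timeline chunk, estimating the baselines jointly with the map, then binning the cleaned timeline — can be shown in a toy form. Every number here (sky size, pointing, baseline length, iteration count) is an assumption for illustration; this is not the Madam or Springtide implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_samp, baseline_len = 50, 10_000, 500

sky = rng.standard_normal(n_pix)             # toy sky map
pointing = rng.integers(0, n_pix, n_samp)    # pixel seen by each time sample
n_base = n_samp // baseline_len
offsets = np.repeat(5.0 * rng.standard_normal(n_base), baseline_len)
tod = sky[pointing] + offsets + 0.01 * rng.standard_normal(n_samp)

hits = np.bincount(pointing, minlength=n_pix)

# naive binned map: per-pixel average; correlated offsets leak in as stripes
naive = np.bincount(pointing, weights=tod, minlength=n_pix) / hits

# destriping: alternately estimate the map and one baseline per chunk
cleaned = tod.copy()
for _ in range(10):
    m = np.bincount(pointing, weights=cleaned, minlength=n_pix) / hits
    resid = tod - m[pointing]                        # map-subtracted timeline
    base = resid.reshape(n_base, baseline_len).mean(axis=1)
    cleaned = tod - np.repeat(base, baseline_len)    # remove baseline estimate
destriped = np.bincount(pointing, weights=cleaned, minlength=n_pix) / hits
```

Shortening the baseline lets the model absorb more of the correlated noise at higher computational cost, which is the accuracy-versus-resources trade-off between baseline lengths described in the abstract. The absolute map level is degenerate with the mean baseline, so comparisons are made after removing the map mean.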
Planck intermediate results. VIII. Filaments between interacting clusters
About half of the baryons of the Universe are expected to be in the form of
filaments of hot and low density intergalactic medium. Most of these baryons
remain undetected even by the most advanced X-ray observatories which are
limited in sensitivity to the diffuse low density medium. The Planck satellite
has provided hundreds of detections of the hot gas in clusters of galaxies via
the thermal Sunyaev-Zel'dovich (tSZ) effect and is an ideal instrument for
studying extended low density media through the tSZ effect. In this paper we
use the Planck data to search for signatures of a fraction of these missing
baryons between pairs of galaxy clusters. Cluster pairs are good candidates for
searching for the hotter and denser phase of the intergalactic medium (which is
more easily observed through the SZ effect). Using an X-ray catalogue of
clusters and the Planck data, we select physical pairs of clusters as
candidates. Using the Planck data we construct a local map of the tSZ effect
centered on each pair of galaxy clusters. ROSAT data is used to construct X-ray
maps of these pairs. After having modelled and subtracted the tSZ effect and
X-ray emission for each cluster in the pair we study the residuals on both the
SZ and X-ray maps. For the merging cluster pair A399-A401 we observe a
significant tSZ effect signal in the intercluster region beyond the virial
radii of the clusters. A joint X-ray SZ analysis allows us to constrain the
temperature and density of this intercluster medium. We obtain a temperature of
kT = (7.1 ± 0.9) keV (consistent with previous estimates) and a baryon density
of (3.7 ± 0.2) × 10^-4 cm^-3. The Planck satellite mission has provided the
first SZ detection of the hot and diffuse intercluster gas.

Comment: Accepted by A&A
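For a uniform slab, the Compton parameter is y = sigma_T n_e kT L / (m_e c^2), so a measured y and temperature directly constrain the electron density. A sketch with assumed filament numbers — the path length and Compton-y are invented for illustration, only kT is taken from the abstract, and the actual analysis models the cluster-pair geometry rather than a uniform slab:

```python
sigma_T = 6.652e-25    # Thomson cross-section [cm^2]
me_c2_keV = 511.0      # electron rest energy [keV]
Mpc_cm = 3.086e24      # 1 Mpc in cm

kT_keV = 7.1           # from the joint X-ray/SZ fit quoted above
L = 1.0 * Mpc_cm       # assumed path length through the filament
y = 1.5e-5             # assumed Compton-y toward the intercluster region

# invert y = sigma_T * n_e * kT * L / (m_e c^2) for the electron density
n_e = y * me_c2_keV / (sigma_T * kT_keV * L)   # [cm^-3]
```

With these assumed inputs n_e comes out at a few 10^-4 cm^-3, the same order of magnitude as the density quoted above; the X-ray data add an independent n_e^2-weighted constraint that breaks the density-temperature degeneracy.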
Planck 2015 results. XXVII. The Second Planck Catalogue of Sunyaev-Zeldovich Sources
We present the all-sky Planck catalogue of Sunyaev-Zeldovich (SZ) sources detected from the 29-month full-mission data. The catalogue (PSZ2) is the largest SZ-selected sample of galaxy clusters yet produced and the deepest all-sky catalogue of galaxy clusters. It contains 1653 detections, of which 1203 are confirmed clusters with identified counterparts in external data-sets, and is the first SZ-selected cluster survey containing more than 10^3 confirmed clusters. We present a detailed analysis of the survey selection function in terms of its completeness and statistical reliability, placing a lower limit of 83% on the purity. Using simulations, we find that the Y5R500 estimates are robust to pressure-profile variation and beam systematics, but accurate conversion to Y500 requires the use of prior information on the cluster extent. We describe the multi-wavelength search for counterparts in ancillary data, which makes use of radio, microwave, infrared, optical and X-ray data-sets, and which places emphasis on the robustness of the counterpart match. We discuss the physical properties of the new sample and identify a population of low-redshift X-ray underluminous clusters revealed by SZ selection. These objects appear in optical and SZ surveys with consistent properties for their mass, but are almost absent from ROSAT X-ray selected samples.
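Completeness and purity, as used above, are the two sides of a cross-match between the candidate list and confirmed clusters: purity is the fraction of detections that are real, completeness the fraction of real clusters that are detected. A toy sketch fixing just the definitions (all identifiers and matches invented; the real catalogue matches on position and redshift with a tolerance, and estimates completeness from simulations):

```python
# candidate SZ detections, and the set of clusters confirmed in external data
detections = {"cand1", "cand2", "cand3", "cand4"}
reference = {"cand1", "cand2", "cand3", "clusterX", "clusterY"}

confirmed = detections & reference                 # matched detections
purity = len(confirmed) / len(detections)          # fraction of detections real
completeness = len(confirmed) / len(reference)     # fraction of clusters found
```

The survey quotes a lower limit on purity (83%) because some unconfirmed candidates may still turn out to be real clusters, which can only raise the true value.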